A Boltzmann machine is a type of stochastic recurrent neural network invented by Geoffrey Hinton and Terry Sejnowski in 1985. Boltzmann machines can be seen as the stochastic, generative counterpart of Hopfield nets. They were one of the first examples of a neural network capable of learning internal representations, and they can represent and (given sufficient time) solve difficult combinatoric problems. They are theoretically intriguing because of the locality and Hebbian nature of their training algorithm, and because of their parallelism and the resemblance of their dynamics to simple physical processes. Due to a number of issues discussed below, Boltzmann machines with unconstrained connectivity have not proven useful for practical problems in machine learning or inference, but if the connectivity is properly constrained, the learning can be made efficient enough to be useful for practical problems.

They are named after the Boltzmann distribution in statistical mechanics, which is used in their sampling function.

==Structure==

A Boltzmann machine, like a Hopfield network, is a network of units with an "energy" defined for the network. It also has binary units, but unlike Hopfield nets, Boltzmann machine units are stochastic. The global energy, E, in a Boltzmann machine is identical in form to that of a Hopfield network:

: E = -\left(\sum_{i<j} w_{ij}\, s_i\, s_j + \sum_i \theta_i\, s_i\right)

Where:
* w_{ij} is the connection strength between unit j and unit i.
* s_i is the state, s_i \in \{0, 1\}, of unit i.
* \theta_i is the bias of unit i in the global energy function. (-\theta_i is the activation threshold for the unit.)

The connections in a Boltzmann machine have two restrictions:
* w_{ii} = 0 \quad \forall i. (No unit has a connection with itself.)
* w_{ij} = w_{ji} \quad \forall i, j. (All connections are symmetric.)

Often the weights are represented in matrix form with a symmetric matrix W, with zeros along the diagonal.
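The global energy formula above can be sketched in a few lines of NumPy. This is a minimal illustration, not part of the original article; the function name `boltzmann_energy` and the example weights are hypothetical.

```python
import numpy as np

def boltzmann_energy(s, W, theta):
    """Global energy E = -(sum_{i<j} w_ij s_i s_j + sum_i theta_i s_i).

    s     : binary state vector (0/1 entries)
    W     : symmetric weight matrix with zeros on the diagonal
    theta : per-unit bias vector
    """
    # s @ W @ s sums over all ordered pairs (i, j), counting each
    # unordered pair twice, so halve it; the zero diagonal contributes
    # nothing, matching the restriction w_ii = 0.
    pairwise = 0.5 * s @ W @ s
    return -(pairwise + theta @ s)

# Illustrative two-unit network: one symmetric connection of weight 1.0,
# biases of 0.5, and both units switched on.
W = np.array([[0.0, 1.0],
              [1.0, 0.0]])
theta = np.array([0.5, 0.5])
s = np.array([1.0, 1.0])
print(boltzmann_energy(s, W, theta))  # -(1.0 + 1.0) = -2.0
```

Because the energy depends only on pairwise products of connected units and the units' own biases, each term can be evaluated locally, which is the property the article's remark about locality refers to.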